Introduction
It is a sad fact that accidents happen, and that when they do, people sometimes die. Thankfully such events are rare in the UK, but they do happen, and one accident or one fatality is one too many.
Humans are the single biggest cause of accidents and incidents, with approximately 80% of aviation industry accidents in 2003 caused by some form of human error (Rankin 2003). The majority of these accidents were attributed primarily to operator (pilot) error, but almost invariably some form of maintenance or other engineering error was a causal factor.
Safety should be the main driver of all engineering personnel, and the desire to reduce the number of incidents to zero should be a major deciding factor in the actions of all personnel involved in the engineering industry – engineers, managers or others.
In this section we will look at the study of human/technology interaction and how it is applied in aviation – in both engineering and flight operations – along with the problems that are yet to be surmounted and the defences against human-induced mistakes. Aviation is used as a benchmark industry, as it is probably the field of engineering in which human factors has achieved the highest level of study and application.
By the end of this section you will be able to evaluate:
- the history of human factors
- the application of crew resources management to non-aviation situations
- the significance of the “Dave” effect
- the effectiveness and application of the Bow Tie method
What are Human Factors?
There are many ways in which human factors is defined. The American Federal Aviation Administration (FAA) describes human factors programmes as those which focus on the people who perform the work, with the intention of addressing the variety of factors – physical, psychosocial and physiological – that affect them. Such programmes should also focus on the capabilities of the individual, seeking to match the task to the person rather than taking a one-size-fits-all approach (FAA 2018).
This definition would seem slightly unfamiliar to many who work in the engineering industry, who would perhaps more broadly describe human factors as a set of methods for reducing the impact of human mistakes within industry – and indeed this working definition is adequate in many cases.
However, human factors is a much broader field than either of these definitions would suggest.
One of the first things to note is that, outside of industry, discussion of ergonomics is as likely to be heard as that of human factors. Originally there was a difference between the two fields, but the terms now tend to be used interchangeably. In the past, ergonomics referred to the scientific study of the measurement of humans (arm length, height, and so on) and how they interacted with machines. Over time it became obvious that far more than the purely physical affected how well humans and machines interacted, and this introduction of psychological concepts led to the development of the separate field of human factors. Finally, it became clear that this separation of fields was somewhat arbitrary, and it was removed, leading to the current situation where the terms are interchangeable.
The International Ergonomics Association (IEA) gives the following definition:
“Ergonomics (or human factors) is the scientific discipline concerned with the understanding of interactions among humans and other elements of a system, and the profession that applies theory, principles, data and methods to design to optimize human well-being and overall system performance” (2014)
The intention of human factors is not just to reduce incidents in industry; it is, more broadly, the drive to make things better for humans. This of course includes reducing accidents, but it ranges from the comfortable design of office workstations all the way through to the efficient transfer of babies from operating theatres to intensive care.
It also includes the understanding and acceptance of the fallibility and fragility of humans, and of their faults and flaws. Rather than judging these or seeking to apportion blame, human factors and ergonomics seeks to reduce the chance of mistakes, and to reduce their impact when they do occur.
Ergonomically Designed Child's Chair
History of Human Factors
Because human factors and ergonomics looks at how humans interact with, and seek to improve, the environment they have created for themselves, there is an argument that ergonomics can be traced back to the very earliest humans, but this is stretching the point somewhat. There is, however, strong evidence that the ancient Greeks employed what would be recognised as ergonomic principles in the design of their tools and workplaces. Hippocrates gives a full description of how a surgeon should design their workplace, lay out their tools and design their workflows that is nearly indistinguishable from the modern equivalent (Marmaras et al 1999).
Further work was done in the 17th and 18th centuries on common injuries to workers but, in reality, modern ergonomics began in the 19th century with Taylor's Scientific Management method, which sought to optimise every process (Drury 1918).
Taylor's theories were generally rejected by researchers from the Soviet Union, largely because of the way in which they sought to turn man into a machine. They argued that the ideal was not to be found within Taylor's theories, as these were concerned purely with economic development rather than development in a broader, intellectual sense. Bekhterev and Myasishchev (1921) instead suggested that the more important drive was towards organising processes to ensure efficiency, but with a minimum of hazards and in a manner that promotes the health and development of the working people.
At this point, the development of human factors shifted to the aviation industry, and it was aviation that drove the study, and indeed continues to do so – along with the medical industry.
Prior to the First World War, the focus was on the pilot – seeking the characteristics that created the ideal aviator. However, the rigours of the war shifted the focus away from the pilot and onto the aircraft itself: how the cockpit was laid out, and how to design displays that were easily read. Studies also began to investigate the effect of environmental factors such as altitude and weather. The development of the aeronautical laboratories at the Brooks and Wright–Patterson air force bases in the USA was a significant step at this time, as they sought to define why, in the same situation, one pilot might succeed where another might fail.
A further – and perhaps surprising – development at this time was the investigation of the effect of illumination on productivity, with the unsurprising discovery that humans work best in appropriately illuminated workplaces. The more surprising discovery was what has been dubbed the Hawthorne effect: productivity will increase due to novelty. If workers feel they are being listened to, they will be more productive – it does not matter whether the lighting is increased or decreased, as long as you ask the workers beforehand. More recent work has also suggested that productivity increases when workers think they are being observed – no matter by whom – which makes the job of the researcher difficult, as the mere fact that they are carrying out research may be enough to cause productivity to go up!
With the increasing complexity of machines brought about by the Second World War, Taylor’s concept of matching people to jobs was no longer feasible. It became apparent that instead machines would need to be designed taking human limitations into account. As such there was a significant amount of work to be done to establish exactly what these limitations are – in the broadest of senses as the variation between individuals can be enormous.
This research moved outside aviation and into other industries, which sought to improve controls and displays to make operation easier – not out of kindness, but for the economic benefit it would bring. It is also around this time that the terms human factors and ergonomics first appear.
After the war, the US Army Air Force published a 19-volume summary of what had been learned during the conflict. If a summary extends to 19 volumes, one can only imagine the size of the unedited version. Significantly, the work contained Chapanis's proof from 1943 that a logical and easy-to-understand cockpit greatly reduced accidents, which in turn helped explain why the best pilots, flying the best aircraft, could still crash.
This work continued to develop and flourish, particularly with the enormous sums of money poured into defence research during the Cold War. The vast sums given to universities meant that research was carried out on everything from the smallest pieces of equipment through to entire systems, and whilst much of this research proved to be of little use to the military, it has been put to good use in civilian industries, from the design of power station control rooms to the moulded shape of training chopsticks.
As with almost all fields, the information age has not just driven traditional human factors research forward; it has also led to the development of the whole new field of human–computer interaction. More advanced machines have enabled step changes in body mapping and other areas, which in turn have driven the development of products as diverse as the ergonomic keyboard, full-body play and training suits for the military, and the individually designed boots on which professional sportspeople spend huge sums.
Human factors has also led to increased interaction and partnership working across the most surprising of professions, including, for example, the partnership between Great Ormond Street Hospital and the Ferrari F1 team. The expertise of the F1 team allowed Great Ormond Street to streamline and simplify the handover from theatre to intensive care, which not only increased efficiency but also increased the likelihood of a good recovery, whilst reducing the chance of mortality during the process (Sower et al 2008).
Crew Resources Management
A specific subset of human factors, Crew Resource Management (CRM), is a method of training used where human error can have a devastating effect on safety. As might be expected, it was developed within aviation, and it is in aviation that it is still most likely to be encountered.
It began to be developed in the 1950s, but was not adopted until the 1980s, with the term cockpit resource management – later generalised into crew resource management – not even appearing until 1979.
The concept of CRM is simple. It seeks to create an environment in which the hierarchy is maintained, but in which lower ranks are encouraged to voice doubts when needed and to seek clarification where appropriate. The Tenerife Airport Disaster of 1977 was the major driver, with NASA formally endorsing the training a matter of weeks after the event, which killed almost 600 people.
Initially the training focused on the flight deck. United was the first airline to offer it, but within a decade it had become a global standard. United also expanded the training beyond the flight deck, bringing in cabin crew to provide an additional layer of safety whilst enhancing communication and teamwork. So effective was CRM that it was expanded into other fields, including aircraft maintenance, where it continues to grow and flourish.
CRM is a broad church of training, encompassing a wide range of knowledge and skills, from teamwork to problem solving, and communication to decision making. It can be defined as a system that seeks to use the human resources available to promote and increase safety within the workplace.
It is not concerned with technical knowledge or skills – these are assumed to be adequate – but rather with the mental processes used to solve problems, make decisions, and gain and maintain situational awareness. It also considers interpersonal skills: the communication and other behavioural skills associated with teamwork. These skills often overlap with each other and with technical skills, but this is unimportant.
Most aviation regulatory bodies mandate that pilots must undergo CRM training, and whilst it is neither mandated nor industry standard in maintenance and engineering, it is becoming more common as airlines seek to reduce the number of costly mistakes made when the aircraft is in the hangar or during the design process.
Skills and Application of CRM
CRM training seeks to enhance a variety of personal attributes: communication, assertiveness, decision making, situational and self-awareness, flexibility and adaptability, and leadership and teamwork. The primary goal is to create an environment in which authority can be respectfully questioned and challenged, and for this to happen as soon as it appears that an error is occurring. It drives home the idea that a difference between what is happening and what should be happening is often (if not always) the first sign that something is not right.
The issue is that, in most organisations, it is difficult to question authority whilst still maintaining respect – and, just as importantly, to show that respect has been maintained. For this reason, communication techniques need to be taught to both junior and senior members of staff, with senior staff learning that being questioned is not (necessarily) being threatened, and junior staff learning the appropriate way to question their superiors.
CRM is a large and complex subject that could be (and is) a training programme in itself; however, the ability to communicate assertively but respectfully is at its core. Bishop (2003) has distilled this down to a five-step process:
- Opening: Address the individual – “Boss”, “Captain” and “Hey Bob” are all appropriate, and should be spoken in a clear and authoritative way. Use whatever salutation is most likely to get their attention, but remain respectful.
- State the concern: State what is bothering you in a direct manner and be honest about your emotions – “I’m worried we’re going to run out of fuel” is a lot more effective than “Think we’ve got enough fuel?”.
- State the problem: This can only be done as you see it – “There’s only half an hour of fuel left”.
- Give a solution: “We can divert to airport x and refuel” is likely to work. Without a solution (even a foolish one) it can sound like whining.
- Obtain buy-in: “Do you think that sounds good?” Once the boss has bought in, they are unlikely to change their mind.
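The five steps above follow a fixed order, which lends itself to a template. Purely as an illustration – the function name and example wording below are hypothetical, not part of Bishop's work – the structure can be sketched as:

```python
# Hypothetical sketch of Bishop's five-step assertive statement.
# The function name and example phrases are illustrative only.

def assertive_statement(opening: str, concern: str, problem: str,
                        solution: str, buy_in: str) -> str:
    """Combine the five steps, in order, into a single statement."""
    return " ".join([opening, concern, problem, solution, buy_in])

msg = assertive_statement(
    opening="Captain,",
    concern="I'm worried we're going to run out of fuel.",
    problem="There's only half an hour of fuel left.",
    solution="We can divert and refuel.",
    buy_in="Do you think that sounds good?",
)
print(msg)
```

The point of the ordering is that each step earns the right to the next: attention first, emotion and facts next, then a proposal, and only then the request for agreement.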
Whilst this seems very straightforward – and indeed it is – it can be very difficult to do, especially as we have generally been conditioned from a young age not to question authority. Changes in interpersonal skills, and even in personality, may be required.
There are a significant number of case studies available – both good and bad – that show the benefits of effective CRM and what can go wrong when it fails.
United Airlines flight 173 is a good example from before CRM was introduced, Air France flight 447 is a good example of what can go wrong, and QANTAS flight 32 is an excellent example of CRM working well.
An in-depth discussion is not needed here, but the accident reports of these and others are well worth reading if time allows.
QANTAS Flight 32 Engine Damage
The “Dave” Effect
The famous author Terry Pratchett worked for a time as a press officer for the nuclear industry, and once famously said (2014):
“Eight years involved with the nuclear industry have taught me that when nothing can possibly go wrong and every avenue has been covered, then is the time to buy a house on the next continent”
It should of course be remembered that Pratchett was not a nuclear physicist and, as a writer of humour, was inclined to make a joke in all situations. However, he did spend eight years working in the press departments of various sections of the UK's nuclear power industry and saw first-hand what could go wrong. His favourite example was what he termed the “Dave” effect (name changed for privacy).
The principle of the Dave effect is simple: a hard-working, efficient, dedicated worker who is not kept fully informed is as likely to cause problems as a lazy and uninterested one. In fact, more so, as their hard-working nature means they will have more chances to get things wrong, and their dedication means they will be inclined to look for potential money-saving opportunities.
Imagine this as an example. A system is designed with three fail-safe alarms, all with separate sensors, and each powered by its own supply. The chance of all three failing at once is vanishingly small.
Now let us imagine that Dave is tasked with fitting three separate cables to three separate UPSs (uninterruptible power supplies). Nobody tells Dave why he is fitting these three cables to these three supplies, nor what the cables are for.
Upon examining the work order, Dave realises that it will be quicker, cheaper and easier to drill one hole through the wall and connect all three cables to a single, larger UPS. The specification of the three individual UPSs is exceeded, the number of holes through the wall is reduced, and the job gets done faster – so Dave gets to go home early and the company has saved money. Everyone is happy.
The problem is obvious now, however. A system that was previously doubly redundant now has a single point of failure, and a failure within the UPS could render the entire system useless.
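The loss of redundancy can be made concrete with a little arithmetic. As a minimal sketch, assuming each UPS fails independently with the same (purely illustrative) probability, the designed system fails only if all three supplies fail at once, whereas Dave's version fails whenever the single shared supply does:

```python
# Illustrative reliability comparison; the failure probability is an
# assumed figure chosen only to show the effect of removing redundancy.

def independent_failure(p_ups: float, n: int) -> float:
    """Probability that ALL n independent supplies fail at once."""
    return p_ups ** n

def shared_failure(p_ups: float) -> float:
    """Probability the single shared supply fails - one point of failure."""
    return p_ups

p = 0.01  # assumed failure probability of any one UPS

as_designed = independent_failure(p, 3)  # all three must fail: 0.01**3
as_built = shared_failure(p)             # Dave's version: 0.01

print(f"Designed (3 independent UPSs): {as_designed:.0e}")
print(f"As built (1 shared UPS):       {as_built:.0e}")
print(f"Risk increased by a factor of {as_built / as_designed:,.0f}")
```

With an assumed failure probability of 1 in 100 per supply, the designed system fails only when all three go down together (about one in a million), while the as-built version is ten thousand times more likely to fail.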
This is a real problem. The desire to do well and seem like a good worker can cause personnel to make changes to work orders. Not to take shortcuts or cut corners, but rather to improve the product as they see it.
The defence against this is three-fold. First, all personnel must be encouraged to follow work orders and other paperwork to the letter. If they find an issue or spot an improvement, they should be encouraged to report it, and rewarded for doing so; should they be found to have strayed from a work order, however, they must be disciplined. The second layer of defence is training and communication: tell staff why they are doing a task, not just what they are doing, and explain the importance of their individual task. This has another benefit – a feeling of inclusion tends to increase both productivity and quality of work. Finally, the best method of all is to design a system that is intrinsically safe and intrinsically Dave-free.
If and when a situation like this comes to light, there are things that must be done. First and foremost, rectify the situation: fixing a fault is always more important than apportioning blame. Next, the problem must be investigated, remembering that Dave was working with the best of intentions and, until shown otherwise, should be considered as much a victim as anyone else. Finally, lessons must not just be learned but also shared. The obvious question is where else Dave has done this, but checks must be made elsewhere too, as an idea that occurs to one person has generally occurred to someone else as well.
The image here shows the New Safe Confinement at the Chernobyl nuclear power plant, an engineering masterpiece that was needed to protect the environment from the harmful materials at Chernobyl, the plant that suffered a catastrophic reactor accident in 1986 even though such an event was supposedly impossible. This impossible event led to the deaths of 54 people in the immediate aftermath from acute radiation sickness and other causes, with unknown potential long-term health effects due to the increased risk of certain cancers.
Construction of the New Safe Confinement began in 2010, at a cost of €2.1 billion – a lot of money to clear up an accident that in theory could not happen and that, in reality, might not have happened had the operators been better trained.
The Bow-Tie Method
Bowtie is one of a group of methodologies designed to help prevent risks from becoming incidents; collectively, these are known as risk barrier methods. Bowtie is extremely common, and the CAA has commented on how useful it has found the method.
Bowtie could be a full section in itself, so this must, by its nature, be only a very superficial investigation.
Bowtie is highly effective as a methodology because it is a visual tool: it depicts risk in an identifiable way and provides the opportunity for an in-depth assessment of the key barriers, both those stopping the unwanted event and those reducing its impact should it occur. Some of the things bowtie can provide are (CAA 2015):
- Effective visual depiction of risk
- Increased awareness and understanding of the risk
- Best practice guidance for safety risk management
- Identification of critical risk controls
- Assessment of control effectiveness
- Balanced risk overview for the whole system
A bowtie model uses different elements to build up a picture of the risk. The whole picture revolves around the hazard and the top event – the release of control of the hazard. After this point we consider threats and consequences, and the controls we have put in place. Controls can go on either side of the picture: on the left are preventative measures, which either eliminate the threat entirely or prevent the threat from causing the top event; on the right are recovery measures, designed to reduce the likelihood of a given consequence or to mitigate its severity.
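The elements described above map naturally onto a simple data structure. The following sketch is purely illustrative – the class names and the example hazard are assumptions, not part of any bowtie standard or tool – but it shows how threats with preventative barriers sit on one side of the top event and consequences with recovery barriers on the other:

```python
from dataclasses import dataclass, field

# A minimal, hypothetical representation of a bowtie model.
# Class, field and example names are illustrative only.

@dataclass
class Barrier:
    name: str

@dataclass
class Threat:
    name: str
    preventative_barriers: list[Barrier] = field(default_factory=list)

@dataclass
class Consequence:
    name: str
    recovery_barriers: list[Barrier] = field(default_factory=list)

@dataclass
class Bowtie:
    hazard: str            # the thing with potential to cause harm
    top_event: str         # the release of control of the hazard
    threats: list[Threat] = field(default_factory=list)
    consequences: list[Consequence] = field(default_factory=list)

# An illustrative model, built from the structure described above.
model = Bowtie(
    hazard="Aviation fuel in storage",
    top_event="Loss of containment of fuel",
    threats=[Threat("Tank corrosion", [Barrier("Inspection regime")])],
    consequences=[Consequence("Fire", [Barrier("Fire suppression system")])],
)
print(model.top_event)
```

A real bowtie would hold many threats and consequences, each with several barriers, which is why the diagrams produced in practice are usually vastly more complex than this sketch.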
Bowtie can be difficult to visualise, but the picture produced will look something like that in the figure below – although usually vastly more complex.
A simple Bowtie Diagram
Summary
Humans are both the biggest cause of mistakes in any industry, and also the best defence against such mistakes. Whilst it is true that humans will continue to make errors, steps can be taken to reduce the number of them, or the impact of those that are made.
Effective application of risk barrier methods along with other accident reduction and safety approaches is not just vital in terms of reducing overall harm to the individual and society as a whole, but also in terms of ensuring long-term financial sustainability.
References
Bekhterev, V. and Myasishchev, V. N. (1921) Quoted by Moray, N (2005). Ergonomics: The History and Scope of Human Factors. Abingdon on Thames: Routledge
Bishop, T. (2003) Crew Resource Management: A Positive Chance for the Fire Service. Fairfax: International Association of Fire Chiefs.
CAA (2015) What does Bowtie Show? [online]. Available at https://www.caa.co.uk/Safety-initiatives-and-resources/Working-with-industry/Bowtie/About-Bowtie/What-does-bowtie-show-/ [1st November 2019]
Drury, H. B. (1918) Scientific Management: A History and Criticism. New York: Columbia University
Fitts, P. M. and Jones, R. E. (1947) Psychological Aspects of Instrument Display: Analysis of 270 “Pilot Error” Experiences in Reading and Interpreting Aircraft Instruments. Dayton: US Air Forces
FAA (2018) Aircraft Maintenance Technician Handbook – General. Oklahoma City: US Department of Transport.
International Ergonomics Association (2014) Definition and Domains of Ergonomics [online]. Available at https://www.iea.cc/whats/index.html [1st November 2019]
Marmaras, N.; Poulakakis, G.; Papakostopoulos, V. (1999) Ergonomic Design in Ancient Greece, Applied Ergonomics. Volume 30 (4), pp 361 – 368
Pratchett, T. (2014) A Slip of the Keyboard. London: Anchor Books
Rankin, W. (2003) MEDA Investigation Process, Aero Magazine. Qtr. 2, Article 3 [online]. Available from https://www.boeing.com/commercial/aeromagazine/articles/qtr_2_07/article_03_1.html [10th March 2020]
Sower, V. E., Duffy, J. A. and Kohers, G. (2008) Ferrari’s Formula One Handovers and Handovers From Surgery to Intensive Care [online]. Available at https://pdfs.semanticscholar.org/d309/739f673039e1f3be7655b0ee3b381aefc2c9.pdf [4th November 2019]